2.
Appl Environ Microbiol ; 90(2): e0183523, 2024 Feb 21.
Article in English | MEDLINE | ID: mdl-38214516

ABSTRACT

Even though differences in methodology (e.g., sample volume and detection method) have been shown to affect observed microbial water quality, multiple sampling and laboratory protocols continue to be used for water quality monitoring. Research is needed to determine how these differences affect the comparability of findings used to generate best management practices and the ability to perform meta-analyses. This study addresses this knowledge gap by compiling and analyzing a data set representing 2,429,990 unique data points on at least one microbial water quality target (e.g., Salmonella presence and Escherichia coli concentration). Variance partitioning analysis was used to quantify the variance in likelihood of detecting each pathogenic target that was uniquely and jointly attributable to non-methodological versus methodological factors. The strength of the association between microbial water quality and select methodological and non-methodological factors was quantified using conditional forest and regression analysis. Based on conditional forest analysis, fecal indicator bacteria concentrations were more strongly associated with non-methodological factors than with methodological factors. Variance partitioning analysis could not disentangle non-methodological and methodological signals for pathogenic Escherichia coli, Salmonella, and Listeria, suggesting that current perceptions of foodborne pathogen ecology in water systems are confounded by methodological differences between studies. For example, 31% of the total variance in likelihood of Salmonella detection was explained by methodological and/or non-methodological factors; of this, 18% was jointly attributable to both methodological and non-methodological factors. Only 13% of the total variance was uniquely attributable to non-methodological factors for Salmonella, highlighting the need for standardized methods for microbiological water quality testing so that results can be compared across studies. IMPORTANCE The microbial ecology of water is already complex, without the added complications of methodological differences between studies. This study highlights the difficulty in comparing water quality data from projects that used different sampling or laboratory methods. These findings have direct implications for end users, as there is no clear way to generalize findings in order to characterize broad-scale ecological phenomena and develop science-based guidance. To best support the development of risk assessments and guidance for monitoring and managing waters, data collection and methods need to be standardized across studies. A minimum set of data attributes that all studies should collect and report in a standardized way is needed. Given the diversity of methods used within applied and environmental microbiology, similar studies are needed for other microbiology subfields to ensure that guidance and policy are based on a robust interpretation of the literature.
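
Below is a minimal Python sketch of the variance-partitioning idea described in this abstract: unique and shared variance components are derived from McFadden pseudo-R² values of nested logistic regressions. The data, column names, and effect sizes are invented for illustration and do not reproduce the study's pipeline.

```python
# Sketch: partition variance in Salmonella detection into unique vs. shared
# components for methodological vs. non-methodological predictors, using
# McFadden pseudo-R^2 from nested logistic regressions (commonality analysis).
# Synthetic data; column names are illustrative, not from the study.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 2000
df = pd.DataFrame({
    "volume_ml": rng.choice([100, 1000, 10000], n),          # methodological
    "detection_method": rng.choice(["culture", "pcr"], n),   # methodological
    "season": rng.choice(["spring", "summer", "fall"], n),   # non-methodological
    "land_use": rng.choice(["urban", "ag", "forest"], n),    # non-methodological
})
logit = -2 + 0.0002 * df.volume_ml + (df.season == "summer") * 0.8
df["salmonella"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

def pseudo_r2(formula):
    fit = smf.logit(formula, df).fit(disp=0)
    return fit.prsquared  # McFadden pseudo-R^2

r2_meth = pseudo_r2("salmonella ~ volume_ml + detection_method")
r2_nonmeth = pseudo_r2("salmonella ~ season + land_use")
r2_full = pseudo_r2("salmonella ~ volume_ml + detection_method + season + land_use")

unique_meth = r2_full - r2_nonmeth                # unique to methodology
unique_nonmeth = r2_full - r2_meth                # unique to non-methodology
shared = r2_full - unique_meth - unique_nonmeth   # jointly attributable
print(f"total={r2_full:.3f} unique_meth={unique_meth:.3f} "
      f"unique_nonmeth={unique_nonmeth:.3f} shared={shared:.3f}")
```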


Subjects
Escherichia coli, Listeria, Environmental Microbiology, Salmonella, Food, Food Microbiology, Food Safety
3.
J Appl Microbiol ; 134(10)2023 Oct 04.
Article in English | MEDLINE | ID: mdl-37709569

ABSTRACT

AIMS: While fecal indicator bacteria (FIB) testing is used to monitor surface water for potential health hazards, observed variation in FIB levels may depend on the scale of analysis (SOA). Two decades of citizen science data, coupled with random effects models, were used to quantify the variance in FIB levels attributable to spatial versus temporal factors. METHODS AND RESULTS: Separately, Bayesian models were used to quantify the ratio of spatial to non-spatial variance in FIB levels and identify associations between environmental factors and FIB levels. Separate analyses were performed for three SOA: waterway, watershed, and statewide. As SOA increased (from waterway to watershed to statewide models), variance attributable to spatial sources generally increased and variance attributable to temporal sources generally decreased. While relationships between FIB levels and environmental factors, such as flow conditions (base versus stormflow), were constant across SOA, the effect of land cover was highly dependent on SOA and consistently smaller than the effect of stormwater infrastructure (e.g. outfalls). CONCLUSIONS: This study demonstrates the importance of SOA when developing water quality monitoring programs or designing future studies to inform water management.
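
A crude illustration of the spatial-versus-temporal variance split described above, assuming synthetic data: the between-site variance of site means stands in for the spatial component and the average within-site variance for the temporal component. The study itself used Bayesian random-effects models, so this is only a back-of-the-envelope analogue.

```python
# Sketch: crude spatial vs. temporal variance split for log10 FIB levels.
# Between-site variance of site means approximates the spatial component;
# the mean of within-site variances approximates the temporal component.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
sites, dates = 40, 60
df = pd.DataFrame([(s, d) for s in range(sites) for d in range(dates)],
                  columns=["site", "date"])
site_eff = rng.normal(0, 0.6, sites)   # spatial variation
date_eff = rng.normal(0, 0.3, dates)   # temporal variation
df["log_fib"] = (2 + site_eff[df.site.values] + date_eff[df.date.values]
                 + rng.normal(0, 0.4, len(df)))

spatial_var = df.groupby("site")["log_fib"].mean().var()    # between sites
temporal_var = df.groupby("site")["log_fib"].var().mean()   # within sites over time
print(f"spatial={spatial_var:.2f} temporal={temporal_var:.2f} "
      f"ratio={spatial_var / temporal_var:.2f}")
```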


Subjects
Citizen Science, Water Quality, Environmental Monitoring/methods, Bayes Theorem, Escherichia coli, Water Microbiology, Feces/microbiology, Bacteria
4.
ISME Commun ; 3(1): 85, 2023 Aug 19.
Article in English | MEDLINE | ID: mdl-37598265

ABSTRACT

Comprehending bacterial genomic variation linked to distinct environments can yield novel insights into the mechanisms underlying differential adaptation and transmission of microbes across environments. Gaining such insights is particularly crucial for pathogens because it benefits public health surveillance. However, understanding of bacterial genomic variation is limited by a scarcity of studies that couple genomic variation with its ecological context. To address this limitation, we focused on Listeria, a bacterial genus important for food safety that includes the human pathogen L. monocytogenes, and analyzed a large-scale genomic dataset that we collected from natural and food-associated environments across the United States. Through comparative genomics analyses of 449 isolates from soil and 390 isolates from agricultural water and produce processing facilities, representing L. monocytogenes, L. seeligeri, L. innocua, and L. welshimeri, we find that genomic profiles strongly differ by environment within each species. This is supported by environment-associated subclades and the differential presence of plasmids, stress islands, and accessory genes involved in cell envelope biogenesis and carbohydrate transport and metabolism. Core genomes of Listeria species are also strongly associated with environments and, using machine learning, can accurately predict isolation sources at the lineage level in L. monocytogenes. We find that the large environment-associated genomic variation in Listeria appears to be jointly driven by soil properties, climate, land use, and co-occurring bacterial species, chiefly representing Actinobacteria and Proteobacteria. Collectively, our data suggest that populations of Listeria species have genetically adapted to different environments, which may limit their transmission from natural to food-associated environments.
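
A hedged sketch of the machine-learning step mentioned above (predicting isolation source from genomic content), assuming a synthetic gene presence/absence matrix and a generic random forest rather than the authors' actual feature set or model.

```python
# Sketch: predict Listeria isolation source (soil vs. food-associated) from
# accessory-gene presence/absence with a random forest and cross-validation.
# Gene content and labels below are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_isolates, n_genes = 400, 300
X = rng.integers(0, 2, size=(n_isolates, n_genes))             # 1 = gene present
signal = X[:, :10].sum(axis=1)                                 # 10 informative genes
y = (signal + rng.normal(0, 1.5, n_isolates) > 5).astype(int)  # 1 = food-associated

clf = RandomForestClassifier(n_estimators=500, random_state=0)
auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print("mean AUC:", auc.mean().round(2))

# gene importances point to candidate environment-associated accessory genes
clf.fit(X, y)
top = np.argsort(clf.feature_importances_)[::-1][:5]
print("top genes (indices):", top)
```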

5.
MMWR Morb Mortal Wkly Rep ; 72(15): 398-403, 2023 Apr 14.
Article in English | MEDLINE | ID: mdl-37053122

ABSTRACT

As of December 31, 2022, a total of 29,939 monkeypox (mpox) cases* had been reported in the United States, 93.3% of which occurred in adult males. During May 10-December 31, 2022, 723,112 persons in the United States received the first dose in a 2-dose mpox (JYNNEOS)† vaccination series; 89.7% of these doses were administered to males (1). The current mpox outbreak has disproportionately affected gay, bisexual, and other men who have sex with men (MSM) and racial and ethnic minority groups (1,2). To examine racial and ethnic disparities in mpox incidence and vaccination rates, rate ratios (RRs) for incidence and vaccination rates and vaccination-to-case ratios were calculated, and trends in these measures were assessed among males aged ≥18 years (males) (3). Incidence in males in all racial and ethnic minority groups except non-Hispanic Asian (Asian) males was higher than that among non-Hispanic White (White) males. At the peak of the outbreak in August 2022, incidences among non-Hispanic Black or African American (Black) and Hispanic or Latino (Hispanic) males were higher than incidence among White males (RR = 6.9 and 4.1, respectively). Overall, vaccination rates were higher among males in racial and ethnic minority groups than among White males. However, the vaccination-to-case ratio was lower among Black (8.8) and Hispanic (16.2) males than among White males (42.5) during the full analytic period, indicating that vaccination rates among Black and Hispanic males were not proportionate to the elevated incidence rates (i.e., these groups had a higher unmet vaccination need). Efforts to increase vaccination among Black and Hispanic males might have resulted in the observed relative increased rates of vaccination; however, these increases were only partially successful in reducing overall incidence disparities. Continued implementation of equity-based vaccination strategies is needed to further increase vaccination rates and reduce the incidence of mpox among all racial and ethnic groups. Recent modeling data (4) showing that, based on current vaccination coverage levels, many U.S. jurisdictions are vulnerable to resurgent mpox outbreaks, underscore the need for continued vaccination efforts, particularly among racial and ethnic minority groups.
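
The two disparity measures reported above reduce to simple ratios; the sketch below shows the arithmetic with made-up counts (only the per-100,000 convention and the ratio definitions are taken from the abstract).

```python
# Sketch of the two disparity measures: an incidence rate ratio
# (group rate / referent rate) and a vaccination-to-case ratio (doses / cases).
def incidence_per_100k(cases: int, population: int) -> float:
    return cases / population * 100_000

def rate_ratio(cases_a, pop_a, cases_ref, pop_ref) -> float:
    return incidence_per_100k(cases_a, pop_a) / incidence_per_100k(cases_ref, pop_ref)

def vaccination_to_case_ratio(first_doses: int, cases: int) -> float:
    return first_doses / cases

# hypothetical example: group A vs. a referent group of equal size
print(round(rate_ratio(690, 1_000_000, 100, 1_000_000), 1))   # RR = 6.9
print(round(vaccination_to_case_ratio(6_100, 690), 1))        # ~8.8 doses per case
```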


Subjects
Sexual and Gender Minorities, Male, Adult, Humans, United States/epidemiology, Adolescent, Ethnicity, Male Homosexuality, Minority Groups, Vaccination, White People
6.
Microbiol Spectr ; : e0038123, 2023 Mar 22.
Article in English | MEDLINE | ID: mdl-36946722

ABSTRACT

The use of water contaminated with Salmonella for produce production contributes to foodborne disease burden. To reduce human health risks, there is a need for novel, targeted approaches for assessing the pathogen status of agricultural water. We investigated the utility of water microbiome data for predicting Salmonella contamination of streams used to source water for produce production. Grab samples were collected from 60 New York streams in 2018 and tested for Salmonella. Separately, DNA was extracted from the samples and used for Illumina shotgun metagenomic sequencing. Reads were trimmed and used to assign taxonomy with Kraken2. Conditional forest (CF), regularized random forest (RRF), and support vector machine (SVM) models were implemented to predict Salmonella contamination. Model performance was assessed using 10-fold cross-validation repeated 10 times to quantify area under the curve (AUC) and Kappa score. CF models outperformed the other two algorithms based on AUC (0.86, CF; 0.81, RRF; 0.65, SVM) and Kappa score (0.53, CF; 0.41, RRF; 0.12, SVM). The taxa that were most informative for accurately predicting Salmonella contamination based on CF were compared to taxa identified by ALDEx2 as being differentially abundant between Salmonella-positive and -negative samples. CF and differential abundance tests both identified Aeromonas salmonicida (variable importance [VI] = 0.012) and Aeromonas sp. strain CA23 (VI = 0.025) as the two most informative taxa for predicting Salmonella contamination. Our findings suggest that microbiome-based models may provide an alternative to or complement existing water monitoring strategies. Similarly, the informative taxa identified in this study warrant further investigation as potential indicators of Salmonella contamination of agricultural water. IMPORTANCE Understanding the associations between surface water microbiome composition and the presence of foodborne pathogens, such as Salmonella, can facilitate the identification of novel indicators of Salmonella contamination. This study assessed the utility of microbiome data and three machine learning algorithms for predicting Salmonella contamination of Northeastern streams. The research reported here both expanded the knowledge on the microbiome composition of surface waters and identified putative novel indicators (i.e., Aeromonas species) for Salmonella in Northeastern streams. These putative indicators warrant further research to assess whether they are consistent indicators of Salmonella contamination across regions, waterways, and years not represented in the data set used in this study. Validated indicators identified using microbiome data may be used as targets in the development of rapid (e.g., PCR-based) detection assays for the assessment of microbial safety of agricultural surface waters.
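
A minimal sketch of the evaluation scheme described above (10-fold cross-validation repeated 10 times, scored by AUC and Cohen's kappa), assuming synthetic data and a scikit-learn random forest as a stand-in for the conditional forest algorithm used in the study.

```python
# Sketch: evaluate a classifier of Salmonella presence from microbiome features
# with repeated stratified 10-fold CV, reporting AUC and Cohen's kappa.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import cohen_kappa_score, make_scorer
from sklearn.model_selection import RepeatedStratifiedKFold, cross_validate

X, y = make_classification(n_samples=60, n_features=200, n_informative=5,
                           weights=[0.6, 0.4], random_state=0)
cv = RepeatedStratifiedKFold(n_splits=10, n_repeats=10, random_state=0)
scoring = {"auc": "roc_auc", "kappa": make_scorer(cohen_kappa_score)}
scores = cross_validate(RandomForestClassifier(n_estimators=300, random_state=0),
                        X, y, cv=cv, scoring=scoring)
print("AUC:", scores["test_auc"].mean().round(2),
      "Kappa:", scores["test_kappa"].mean().round(2))
```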

7.
J Food Prot ; 86(3): 100045, 2023 03.
Article in English | MEDLINE | ID: mdl-36916552

ABSTRACT

Surface water environments are inherently heterogeneous, and little is known about variation in microbial water quality between locations. This study sought to understand how microbial water quality differs within and between Virginia ponds. Grab samples were collected twice per week from 30 sampling sites across nine Virginia ponds (n = 600). Samples (100 mL) were enumerated for total coliform (TC) and Escherichia coli (EC) levels, and physicochemical, weather, and environmental data were collected. Bayesian models of coregionalization were used to quantify the variance in TC and EC levels attributable to spatial (e.g., site, pond) versus nonspatial (e.g., date, pH) sources. Mixed-effects Bayesian regressions and conditional inference trees were used to characterize relationships between these data and TC or EC levels. Analyses were performed separately for each pond with ≥3 sampling sites (five intrapond models), while one interpond model was developed using data from all sampling sites and all ponds. More variance in TC levels was attributable to spatial, as opposed to nonspatial, sources for the interpond model (variance ratio [VR] = 1.55), while results for the intrapond models were pond dependent (VR: 0.65-18.89). For EC levels, more variance was attributable to spatial sources in the interpond model (VR = 1.62) than in any intrapond model (all VR < 1.0), suggesting that more variance is attributable to nonspatial factors within individual ponds and to spatial factors when multiple ponds are considered. Within each pond, TC and EC levels were spatially independent for sites 56-87 m apart, indicating that different sites within the same pond represent different water quality for risk management purposes. Rainfall was positively, and pH negatively, associated with TC and EC levels in both inter- and intrapond models. For all other factors, the direction and strength of associations varied. Factors driving microbial dynamics in ponds appear to be pond specific and differ depending on the spatial scale considered.


Subjects
Agricultural Irrigation, Ponds, Ponds/microbiology, Bayes Theorem, Bacteria, Water Quality, Escherichia coli
8.
Appl Environ Microbiol ; 89(2): e0152922, 2023 02 28.
Article in English | MEDLINE | ID: mdl-36728439

ABSTRACT

The heterogeneity of produce production environments complicates the development of universal strategies for managing preharvest produce safety risks. Understanding pathogen ecology in different produce-growing regions is important for developing targeted mitigation strategies. This study aimed to identify environmental and spatiotemporal factors associated with isolating Salmonella and Listeria from environmental samples collected from 10 Virginia produce farms. Soil (n = 400), drag swab (n = 400), and irrigation water (n = 120) samples were tested for Salmonella and Listeria, and results were confirmed by PCR. Salmonella serovar and Listeria species were identified by the Kauffmann-White-Le Minor scheme and partial sigB sequencing, respectively. Conditional forest analysis and Bayesian mixed models were used to characterize associations between environmental factors and the likelihood of isolating Salmonella, Listeria monocytogenes (LM), and other targets (e.g., Listeria spp. and Salmonella enterica serovar Newport). Surrogate trees were used to visualize hierarchical associations identified by the forest analyses. Salmonella and LM prevalence was 5.3% (49/920) and 2.3% (21/920), respectively. The likelihood of isolating Salmonella was highest in water samples collected from the Eastern Shore of Virginia with a dew point of >9.4°C. The likelihood of isolating LM was highest in water samples collected in winter from sites where <36% of the land use within 122 m was forest wetland cover. Conditional forest results were consistent with the mixed models, which also found that the likelihood of detecting Salmonella and LM differed between sample type, region, and season. These findings identified factors that increased the likelihood of isolating Salmonella- and LM-positive samples in produce production environments and support preharvest mitigation strategies on a regional scale. IMPORTANCE This study sought to examine different growing regions across the state of Virginia and to determine how factors associated with pathogen prevalence may differ between regions. Spatial and temporal data were modeled to identify factors associated with an increased pathogen likelihood in various on-farm sources. The findings of the study show that prevalence of Salmonella and L. monocytogenes is low overall in the produce preharvest environment but does vary by space (e.g., region in Virginia) and time (e.g., season), and the likelihood of pathogen-positive samples is influenced by different spatial and temporal factors. Therefore, the results support regional or scale-dependent food safety standards and guidance documents for controlling hazards to minimize risk. This study also suggests that water source assessments are important tools for developing monitoring programs and mitigation measures, as spatiotemporal factors differ on a regional scale.
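
A small sketch of the surrogate-tree idea referenced above: a shallow decision tree is fit to a forest's predictions so the forest's hierarchical splits can be read directly. Feature names and data are synthetic placeholders that merely echo the abstract (the 9.4°C dew-point split is reused only as the generating rule).

```python
# Sketch: fit a surrogate decision tree to a random forest's predictions so the
# forest's hierarchical structure (sample type -> region -> weather) can be
# visualized, mimicking the surrogate-tree step described above.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(3)
n = 920
df = pd.DataFrame({
    "sample_type": rng.choice(["soil", "swab", "water"], n),
    "region": rng.choice(["eastern_shore", "piedmont", "valley"], n),
    "dew_point_c": rng.normal(8, 5, n),
})
df["salmonella"] = ((df.sample_type == "water") & (df.dew_point_c > 9.4)
                    & (rng.random(n) < 0.6)).astype(int)

X = pd.get_dummies(df.drop(columns="salmonella"))
forest = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, df.salmonella)

# surrogate: a shallow tree trained to reproduce the forest's predictions
surrogate = DecisionTreeClassifier(max_depth=3, random_state=0)
surrogate.fit(X, forest.predict(X))
print(export_text(surrogate, feature_names=list(X.columns)))
```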


Subjects
Listeria monocytogenes, Farms, Listeria monocytogenes/genetics, Prevalence, Virginia/epidemiology, Bayes Theorem, Salmonella/genetics
9.
MMWR Morb Mortal Wkly Rep ; 71(45): 1449-1456, 2022 Nov 11.
Article in English | MEDLINE | ID: mdl-36355615

ABSTRACT

On May 17, 2022, the Massachusetts Department of Health announced the first suspected case of monkeypox associated with the global outbreak in a U.S. resident. On May 23, 2022, CDC launched an emergency response (1,2). CDC's emergency response focused on surveillance, laboratory testing, medical countermeasures, and education. Medical countermeasures included rollout of a national JYNNEOS vaccination strategy, Food and Drug Administration (FDA) issuance of an emergency use authorization to allow for intradermal administration of JYNNEOS, and use of tecovirimat for patients with, or at risk for, severe monkeypox. During May 17-October 6, 2022, a total of 26,384 probable and confirmed* U.S. monkeypox cases were reported to CDC. Daily case counts peaked during mid-to-late August. Among 25,001 of 25,569 (98%) cases in adults with information on gender identity,† 23,683 (95%) occurred in cisgender men. Among 13,997 cisgender men with information on recent sexual or close intimate contact,§ 10,440 (75%) reported male-to-male sexual contact (MMSC) ≤21 days preceding symptom onset. Among 21,211 (80%) cases in persons with information on race and ethnicity,¶ 6,879 (32%), 6,628 (31%), and 6,330 (30%) occurred in non-Hispanic Black or African American (Black), Hispanic or Latino (Hispanic), and non-Hispanic White (White) persons, respectively. Among 5,017 (20%) cases in adults with information on HIV infection status, 2,876 (57%) had HIV infection. Prevention efforts, including vaccination, should be prioritized among persons at highest risk within groups most affected by the monkeypox outbreak, including gay, bisexual, and other men who have sex with men (MSM); transgender, nonbinary, and gender-diverse persons; racial and ethnic minority groups; and persons who are immunocompromised, including persons with advanced HIV infection or newly diagnosed HIV infection.


Subjects
HIV Infections, Sexual and Gender Minorities, Adult, United States/epidemiology, Humans, Male, Female, Male Homosexuality, Ethnicity, HIV Infections/prevention & control, Minority Groups, Gender Identity, Cause of Death, Disease Outbreaks
10.
Appl Environ Microbiol ; 88(23): e0160022, 2022 12 13.
Article in English | MEDLINE | ID: mdl-36409131

ABSTRACT

While growers have reported pressure to minimize wildlife intrusion into produce fields through noncrop vegetation (NCV) removal, NCV provides key ecosystem services. To model food safety and environmental tradeoffs associated with NCV removal, published and publicly available food safety and water quality data from the Northeastern United States were obtained. Because data on NCV removal are not widely available, forest-wetland cover was used as a proxy, consistent with previous studies. Structural equation models (SEMs) were used to quantify the effect of forest-wetland cover on (i) food safety outcomes (e.g., detecting pathogens in soil) and (ii) water quality (e.g., nutrient levels). Based on the SEMs, NCV was either not associated with food safety outcomes or had a protective effect (more NCV was associated with a reduced likelihood of pathogen detection). The probabilities of detecting Listeria spp. in soil (effect estimate [EE] = -0.17; P = 0.005) and enterohemorrhagic Escherichia coli in stream samples (EE = -0.27; P < 0.001) were negatively associated with the amount of NCV surrounding the sampling site. Larger amounts of NCV were also associated with lower nutrient, salinity, and sediment levels, and higher dissolved oxygen levels. Total phosphorus levels were negatively associated with the amount of NCV in the upstream watershed (EE = -0.27; P < 0.001). Similar negative associations (P < 0.05) were observed for other physicochemical parameters, such as nitrate (EE = -0.38). Our findings suggest that NCV should not be considered an inherent produce safety risk or result in farm audit demerits. This study also provides a framework for evaluating environmental tradeoffs associated with using specific preharvest food safety strategies. IMPORTANCE Currently, on-farm food safety decisions are typically made independently of conservation considerations, often with detrimental impacts on agroecosystems. Comanaging agricultural environments to simultaneously meet conservation and food safety aims is complicated because farms are closely linked to surrounding environments, and management decisions can have unexpected environmental, economic, and food safety consequences. Thus, there is a need for research on the conservation and food safety tradeoffs associated with implementing specific preharvest food safety practices. Understanding these tradeoffs is critical for developing adaptive comanagement strategies and ensuring the short- and long-term safety, sustainability, and profitability of agricultural systems. This study quantifies tradeoffs and synergies between food safety and environmental aims, and outlines a framework for modeling tradeoffs and synergies between management aims that can be used to support future comanagement research.
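
The study used structural equation models; as a rough stand-in, the sketch below estimates a single standardized path (forest-wetland cover to total phosphorus) with an ordinary regression on z-scored variables, so the slope is interpretable like the effect estimates (EE) quoted above. Variable names and data are hypothetical.

```python
# Sketch: a single standardized path estimated by OLS rather than a full SEM.
# Synthetic data; the negative slope mimics the direction reported above.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 500
df = pd.DataFrame({"forest_wetland_pct": rng.uniform(0, 100, n)})
df["total_phosphorus"] = 0.05 - 0.0003 * df.forest_wetland_pct + rng.normal(0, 0.01, n)

z = (df - df.mean()) / df.std()          # standardize so slopes act like EEs
fit = smf.ols("total_phosphorus ~ forest_wetland_pct", data=z).fit()
print(fit.params["forest_wetland_pct"], fit.pvalues["forest_wetland_pct"])
```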


Subjects
Ecosystem, Water Quality, Farms, Food Safety, Agriculture, Soil
11.
Appl Environ Microbiol ; 88(23): e0101522, 2022 12 13.
Article in English | MEDLINE | ID: mdl-36377948

ABSTRACT

Commercial leafy greens customers often require a negative preharvest pathogen test, typically by compositing 60 produce sample grabs of 150 to 375 g total mass from lots of various acreages. This study developed a preharvest sampling Monte Carlo simulation, validated it against literature and experimental trials, and used it to suggest improvements to sampling plans. The simulation was validated by outputting six simulated ranges of positive samples that contained the experimental number of positive samples (range, 2 to 139 positives) recovered from six field trials with point source, systematic, and sporadic contamination. We then evaluated the relative performance between simple random, stratified random, or systematic sampling in a 1-acre field to detect point sources of contamination present at 0.3% to 1.7% prevalence. Randomized sampling was optimal because of lower variability in probability of acceptance. Optimized sampling was applied to detect an industry-relevant point source [3 log(CFU/g) over 0.3% of the field] and widespread contamination [-1 to -4 log(CFU/g) over the whole field] by taking 60 to 1,200 sample grabs of 3 g. More samples increased the power of detecting point source contamination, as the median probability of acceptance decreased from 85% with 60 samples to 5% with 1,200 samples. Sampling plans with larger total composite sample mass increased power to detect low-level, widespread contamination, as the median probability of acceptance with -3 log(CFU/g) contamination decreased from 85% with a 150-g total mass to 30% with a 1,200-g total mass. Therefore, preharvest sampling power increases by taking more, smaller samples with randomization, up to the constraints of total grabs and mass feasible or required for a food safety objective. IMPORTANCE This study addresses a need for improved preharvest sampling plans for pathogen detection in leafy green fields by developing and validating a preharvest sampling simulation model, avoiding the expensive task of physical sampling in many fields. Validated preharvest sampling simulations were used to develop guidance for preharvest sampling protocols. Sampling simulations predicted that sampling plans with randomization are less variable in their power to detect low-prevalence point source contamination in a 1-acre field. Collecting larger total sample masses improved the power of sampling plans in detecting widespread contamination in 1-acre fields. Hence, the power of typical sampling plans that collect 150 to 375 g per composite sample can be improved by taking more, randomized smaller samples for larger total sample mass. The improved sampling plans are subject to feasibility constraints or to meet a particular food safety objective.
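
A minimal Monte Carlo sketch of the point-source scenario described above: each of N randomly placed grabs independently lands in a contaminated patch covering 0.3% of a 1-acre field, and the plan is "accepted" if no grab hits the patch. Detection within the patch is assumed perfect, so this is a simplified analogue of the published simulation, not a reimplementation.

```python
# Sketch: Monte Carlo estimate of the probability of acceptance (no positive
# grabs) for N randomly placed grabs when a point source covers 0.3% of the field.
import numpy as np

rng = np.random.default_rng(0)

def prob_acceptance(n_grabs: int, prevalence: float = 0.003,
                    n_iter: int = 20_000) -> float:
    # each grab independently lands in the contaminated patch with p = prevalence
    hits = rng.binomial(n_grabs, prevalence, size=n_iter)
    return float(np.mean(hits == 0))     # accepted if no grab hits the patch

for n in (60, 300, 1200):
    print(n, "grabs -> P(accept) =", round(prob_acceptance(n), 2))
```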


Assuntos
Contaminação de Alimentos , Inocuidade dos Alimentos , Contaminação de Alimentos/análise , Folhas de Planta , Simulação por Computador , Microbiologia de Alimentos , Contagem de Colônia Microbiana
12.
Front Microbiol ; 13: 768527, 2022.
Article in English | MEDLINE | ID: mdl-35847115

ABSTRACT

Freshwater bodies receive waste, feces, and fecal microorganisms from agricultural, urban, and natural activities. In this study, the probable sources of fecal contamination were determined and antibiotic-resistant bacteria (ARB) were detected in the two main rivers of central Chile. Surface water samples were collected from 12 sampling sites in the Maipo (n = 8) and Maule (n = 4) Rivers every 3 months, from August 2017 until April 2019. To determine the level of fecal contamination, fecal coliforms were quantified using the most probable number (MPN) method, and the source of fecal contamination was determined by microbial source tracking (MST) using Cryptosporidium and Giardia genotyping. Separately, to determine whether ARB were present in the rivers, Escherichia coli and environmental bacteria were isolated and their antibiotic susceptibility profiles determined. Fecal coliform levels in the Maule and Maipo Rivers ranged between 1 and 130 MPN/100 ml, and between 2 and 30,000 MPN/100 ml, respectively. Based on the MST results using Cryptosporidium and Giardia host-specific species, humans, cattle, birds, and/or dogs were the probable sources of fecal contamination in both rivers, with human- and cattle-specific species detected most frequently. Conditional tree analysis indicated that coliform levels were significantly associated with river system (Maipo versus Maule), land use, and season. Fecal coliform levels were significantly (p < 0.006) higher at urban and agricultural sites than at sites immediately downstream of treatment centers, livestock areas, or natural areas. Three of eight (37.5%) E. coli isolates presented a multidrug-resistance (MDR) phenotype. Similarly, 6.6% (117/1,768) and 5.1% (44/863) of environmental isolates from the Maipo and Maule Rivers, respectively, showed an MDR phenotype. Efforts to reduce fecal discharge into these rivers should thus focus on agricultural and urban land uses, as these areas contributed most, and most frequently, to fecal contamination of the rivers, while human and cattle fecal discharges were identified by the MST approach as the most likely sources of this contamination. This information can be used to design better mitigation strategies, thereby reducing the burden of waterborne diseases and antimicrobial resistance (AMR) in central Chile.

13.
J Appl Microbiol ; 132(3): 2342-2354, 2022 Mar.
Article in English | MEDLINE | ID: mdl-34637586

ABSTRACT

AIMS: This study investigated Salmonella concentrations following combinations of horticultural practices including anaerobic soil disinfestation (ASD), soil amendment type and irrigation regimen. METHODS AND RESULTS: Sandy-loam soil was inoculated with a five-serovar Salmonella cocktail (5.5 ± 0.2 log CFU per gram) and subjected to one of six treatments: (i) no soil amendment, ASD (ASD control), (ii) no soil amendment, no-ASD (non-ASD control) and (iii-vi) soil amended with pelletized poultry litter, rye, rapeseed or hairy vetch with ASD. The effect of irrigation regimen was determined by collecting samples 3 and 7 days after irrigation. Twenty-five-gram soil samples were collected pre-ASD, post-soil saturation (i.e. ASD-process), and at 14 time-points post-ASD, and Salmonella levels enumerated. Log-linear models examined the effect of amendment type and irrigation regimen on Salmonella die-off during and post-ASD. During ASD, Salmonella concentrations significantly decreased in all treatments (range: -0.2 to -2.7 log CFU per gram), albeit the smallest decrease (-0.2 log CFU per gram observed in the pelletized poultry litter) was of negligible magnitude. Salmonella die-off rates varied by amendment with an average post-ASD rate of -0.05 log CFU per gram day (CI = -0.05, -0.04). Salmonella concentrations remained highest over the 42 days post-ASD in pelletized poultry litter, followed by rapeseed, and hairy vetch treatments. Findings suggested ASD was not able to eliminate Salmonella in soil, and certain soil amendments facilitated enhanced Salmonella survival. Salmonella serovar distribution differed by treatment with pelletized poultry litter supporting S. Newport survival, compared with other serovars. Irrigation appeared to assist Salmonella survival with concentrations being 0.14 log CFU per gram (CI = 0.05, 0.23) greater 3 days, compared with 7 days post-irrigation. CONCLUSIONS: ASD does not eliminate Salmonella in soil, and may in fact, depending on the soil amendment used, facilitate Salmonella survival. SIGNIFICANCE AND IMPACT OF THE STUDY: Synergistic and antagonistic effects on food safety hazards of implementing horticultural practices should be considered.
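
The post-ASD die-off rates quoted above come from log-linear models; the sketch below fits the same functional form (log CFU/g regressed on days) to synthetic counts to show how a rate such as -0.05 log CFU/g per day is obtained.

```python
# Sketch: estimate a post-ASD log-linear die-off rate by regressing log counts
# on days post-ASD. Data are synthetic with a true rate of -0.05 log CFU/g/day.
import numpy as np

rng = np.random.default_rng(5)
days = np.array([0, 1, 2, 4, 7, 10, 14, 21, 28, 42], dtype=float)
log_cfu = 5.0 - 0.05 * days + rng.normal(0, 0.15, days.size)

slope, intercept = np.polyfit(days, log_cfu, deg=1)
print(f"die-off rate: {slope:.3f} log CFU/g per day, intercept {intercept:.2f}")
```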


Assuntos
Microbiologia do Solo , Solo , Irrigação Agrícola , Agricultura/métodos , Anaerobiose , Salmonella
14.
Nat Microbiol ; 6(8): 1021-1030, 2021 08.
Article in English | MEDLINE | ID: mdl-34267358

ABSTRACT

Natural bacterial populations can display enormous genomic diversity, primarily in the form of gene content variation caused by the frequent exchange of DNA with the local environment. However, the ecological drivers of genomic variability and the role of selection remain controversial. Here, we address this gap by developing a nationwide atlas of 1,854 Listeria isolates, collected systematically from soils across the contiguous United States. We found that Listeria was present across a wide range of environmental parameters, being mainly controlled by soil moisture, molybdenum and salinity concentrations. Whole-genome data from 594 representative strains allowed us to decompose Listeria diversity into 12 phylogroups, each with large differences in habitat breadth and endemism. 'Cosmopolitan' phylogroups, prevalent across many different habitats, had more open pangenomes and displayed weaker linkage disequilibrium, reflecting higher rates of gene gain and loss, and allele exchange than phylogroups with narrow habitat ranges. Cosmopolitan phylogroups also had a large fraction of genes affected by positive selection. The effect of positive selection was more pronounced in the phylogroup-specific core genome, suggesting that lineage-specific core genes are important drivers of adaptation. These results indicate that genome flexibility and recombination are the consequence of selection to survive in variable environments.
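
Pangenome "openness" of the kind contrasted above is often summarized by fitting Heaps' law to gene-accumulation curves; the sketch below shows that generic calculation on synthetic counts. It is not the paper's exact analysis, and the parameter values are illustrative only.

```python
# Sketch: quantify pangenome openness by fitting Heaps' law
# (pangenome size ~ k * N^gamma); gamma nearer 1 implies a more open pangenome.
import numpy as np
from scipy.optimize import curve_fit

def heaps(n_genomes, k, gamma):
    return k * n_genomes ** gamma

n_genomes = np.arange(1, 101, dtype=float)
pangenome_size = heaps(n_genomes, 2500, 0.35) \
    + np.random.default_rng(2).normal(0, 30, n_genomes.size)

(k, gamma), _ = curve_fit(heaps, n_genomes, pangenome_size, p0=(2000, 0.5))
print(f"k={k:.0f}, gamma={gamma:.2f}  (larger gamma -> more open pangenome)")
```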


Assuntos
Genoma Bacteriano , Listeria/genética , Seleção Genética , Microbiologia do Solo , Ecossistema , Evolução Molecular , Listeria/classificação , Listeria/isolamento & purificação , Filogenia , Recombinação Genética
16.
MMWR Morb Mortal Wkly Rep ; 70(22): 818-824, 2021 Jun 04.
Article in English | MEDLINE | ID: mdl-34081685

ABSTRACT

Disparities in vaccination coverage by social vulnerability, defined as social and structural factors associated with adverse health outcomes, were noted during the first 2.5 months of the U.S. COVID-19 vaccination campaign, which began during mid-December 2020 (1). As vaccine eligibility and availability continue to expand, assuring equitable coverage for disproportionately affected communities remains a priority. CDC examined COVID-19 vaccine administration and 2018 CDC social vulnerability index (SVI) data to ascertain whether inequities in COVID-19 vaccination coverage with respect to county-level SVI have persisted, overall and by urbanicity. Vaccination coverage was defined as the number of persons aged ≥18 years (adults) who had received ≥1 dose of any Food and Drug Administration (FDA)-authorized COVID-19 vaccine divided by the total adult population in a specified SVI category.† SVI was examined overall and by its four themes (socioeconomic status, household composition and disability, racial/ethnic minority status and language, and housing type and transportation). Counties were categorized into SVI quartiles, in which quartile 1 (Q1) represented the lowest level of vulnerability and quartile 4 (Q4), the highest. Trends in vaccination coverage were assessed by SVI quartile and urbanicity, which was categorized as large central metropolitan, large fringe metropolitan (areas surrounding large cities, e.g., suburban), medium and small metropolitan, and nonmetropolitan counties.§ During December 14, 2020-May 1, 2021, disparities in vaccination coverage by SVI increased, especially in large fringe metropolitan (e.g., suburban) and nonmetropolitan counties. By May 1, 2021, vaccination coverage was lower among adults living in counties with the highest overall SVI; differences were most pronounced in large fringe metropolitan (Q4 coverage = 45.0% versus Q1 coverage = 61.7%) and nonmetropolitan (Q4 = 40.6% versus Q1 = 52.9%) counties. Vaccination coverage disparities were largest for two SVI themes: socioeconomic status (Q4 = 44.3% versus Q1 = 61.0%) and household composition and disability (Q4 = 42.0% versus Q1 = 60.1%). Outreach efforts, including expanding public health messaging tailored to local populations and increasing vaccination access, could help increase vaccination coverage in high-SVI counties.
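
A minimal sketch of the core calculation behind the Q1-Q4 comparisons above: counties are binned into SVI quartiles and dose-1 coverage is computed per quartile. County counts are synthetic, and the real analysis additionally stratified by urbanicity and SVI theme.

```python
# Sketch: group counties into SVI quartiles and compute dose-1 coverage per
# quartile (vaccinated adults / adult population within each quartile).
import numpy as np
import pandas as pd

rng = np.random.default_rng(4)
n_counties = 3000
df = pd.DataFrame({
    "svi": rng.uniform(0, 1, n_counties),
    "adult_pop": rng.integers(5_000, 500_000, n_counties),
})
df["vaccinated"] = (df.adult_pop * (0.62 - 0.15 * df.svi)
                    * rng.uniform(0.9, 1.1, n_counties)).astype(int)

df["svi_quartile"] = pd.qcut(df.svi, 4, labels=["Q1", "Q2", "Q3", "Q4"])
coverage = (df.groupby("svi_quartile", observed=True)["vaccinated"].sum()
            / df.groupby("svi_quartile", observed=True)["adult_pop"].sum())
print((coverage * 100).round(1))
```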


Assuntos
Vacinas contra COVID-19/administração & dosagem , Disparidades em Assistência à Saúde/estatística & dados numéricos , População Urbana/estatística & dados numéricos , Cobertura Vacinal/estatística & dados numéricos , Populações Vulneráveis/estatística & dados numéricos , Adulto , COVID-19/epidemiologia , COVID-19/prevenção & controle , Cidades/epidemiologia , Humanos , Fatores Socioeconômicos , Estados Unidos/epidemiologia
17.
Front Artif Intell ; 4: 628441, 2021.
Article in English | MEDLINE | ID: mdl-34056577

ABSTRACT

Since E. coli is considered a fecal indicator in surface water, government water quality standards and industry guidance often rely on E. coli monitoring to identify when there is an increased risk of pathogen contamination of water used for produce production (e.g., for irrigation). However, studies have indicated that E. coli testing can present an economic burden to growers and that time lags between sampling and obtaining results may reduce the utility of these data. Models that predict E. coli levels in agricultural water may provide a mechanism for overcoming these obstacles. Thus, this proof-of-concept study uses previously published datasets to train, test, and compare E. coli predictive models using multiple algorithms and performance measures. Since the collection of different feature data carries specific costs for growers, predictive performance was compared for models built using different feature types [geospatial, water quality, stream traits, and/or weather features]. Model performance was assessed against baseline regression models. Model performance varied considerably with root-mean-squared errors and Kendall's Tau ranging between 0.37 and 1.03, and 0.07 and 0.55, respectively. Overall, models that included turbidity, rain, and temperature outperformed all other models regardless of the algorithm used. Turbidity and weather factors were also found to drive model accuracy even when other feature types were included in the model. These findings confirm previous conclusions that machine learning models may be useful for predicting when, where, and at what level E. coli (and associated hazards) are likely to be present in preharvest agricultural water sources. This study also identifies specific algorithm-predictor combinations that should be the foci of future efforts to develop deployable models (i.e., models that can be used to guide on-farm decision-making and risk mitigation). When deploying E. coli predictive models in the field, it is important to note that past research indicates an inconsistent relationship between E. coli levels and foodborne pathogen presence. Thus, models that predict E. coli levels in agricultural water may be useful for assessing fecal contamination status and ensuring compliance with regulations but should not be used to assess the risk that specific pathogens of concern (e.g., Salmonella, Listeria) are present.
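
A small sketch of the evaluation reported above, assuming synthetic data: a regression model predicting log10 E. coli levels from turbidity, rain, and temperature (the feature types highlighted as most useful) is scored with RMSE and Kendall's Tau. The model and feature definitions are placeholders, not the study's algorithms.

```python
# Sketch: train/test evaluation of an E. coli level predictor using RMSE and
# Kendall's Tau, the two performance measures reported above.
import numpy as np
from scipy.stats import kendalltau
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(6)
n = 800
X = np.column_stack([rng.gamma(2, 10, n),      # turbidity
                     rng.exponential(5, n),    # rain (mm, prior 24 h)
                     rng.normal(18, 6, n)])    # water temperature
y = 1.0 + 0.02 * X[:, 0] + 0.05 * X[:, 1] + rng.normal(0, 0.4, n)  # log10 E. coli

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
pred = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr).predict(X_te)
rmse = mean_squared_error(y_te, pred) ** 0.5
tau = kendalltau(y_te, pred).correlation
print(f"RMSE={rmse:.2f}  Kendall's tau={tau:.2f}")
```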

18.
Article in English | MEDLINE | ID: mdl-33999788

ABSTRACT

A total of 27 Listeria isolates that could not be classified to the species level were obtained from soil samples from different locations in the contiguous United States and from an agricultural water sample from New York. Whole-genome sequence-based average nucleotide identity BLAST (ANIb) showed that the 27 isolates form five distinct clusters; for each cluster, all draft genomes showed ANI values of <95% similarity to genomes outside the cluster, including all currently described Listeria species, indicating that each cluster represents a novel species. Of the five novel species, three cluster with the Listeria sensu stricto clade and two with the sensu lato clade. One of the novel sensu stricto species, designated L. cossartiae sp. nov., contains two subclusters with an average ANI similarity of 94.9%, which were designated as subspecies. The proposed three novel sensu stricto species (including two subspecies) are Listeria farberi sp. nov. (type strain FSL L7-0091T=CCUG 74668T=LMG 31917T; maximum ANI 91.9% to L. innocua), Listeria immobilis sp. nov. (type strain FSL L7-1519T=CCUG 74666T=LMG 31920T; maximum ANI 87.4% to L. ivanovii subsp. londoniensis) and Listeria cossartiae sp. nov. [subsp. cossartiae (type strain FSL L7-1447T=CCUG 74667T=LMG 31919T; maximum ANI 93.4% to L. marthii) and subsp. cayugensis (type strain FSL L7-0993T=CCUG 74670T=LMG 31918T; maximum ANI 94.7% to L. marthii)]. The two proposed novel sensu lato species are Listeria portnoyi sp. nov. (type strain FSL L7-1582T=CCUG 74671T=LMG 31921T; maximum ANI value of 88.9% to L. cornellensis and 89.2% to L. newyorkensis) and Listeria rustica sp. nov. (type strain FSL W9-0585T=CCUG 74665T=LMG 31922T; maximum ANI value of 88.7% to L. cornellensis and 88.9% to L. newyorkensis). L. immobilis is the first non-motile sensu stricto species isolated to date. All five of the novel species are non-haemolytic and negative for phosphatidylinositol-specific phospholipase C activity; the draft genomes lack the virulence genes found in Listeria pathogenicity island 1 (LIPI-1) and the internalin genes inlA and inlB, indicating that they are non-pathogenic.
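
A minimal sketch of species delineation from a pairwise ANI matrix at the conventional 95% cut-off used above, via single-linkage clustering. The 6x6 matrix is invented; in practice, ANI values would be computed from the draft genomes with an ANI tool.

```python
# Sketch: delineate putative species by clustering a pairwise ANI matrix and
# cutting at 5 percentage points of distance (i.e., the 95% ANI threshold).
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform

ani = np.array([
    [100.0, 96.8, 88.0, 87.5, 80.1, 80.3],
    [ 96.8, 100.0, 88.2, 87.9, 80.0, 80.2],
    [ 88.0, 88.2, 100.0, 96.0, 80.5, 80.4],
    [ 87.5, 87.9, 96.0, 100.0, 80.3, 80.6],
    [ 80.1, 80.0, 80.5, 80.3, 100.0, 97.5],
    [ 80.3, 80.2, 80.4, 80.6, 97.5, 100.0],
])
dist = squareform(100.0 - ani, checks=False)          # ANI -> distance
clusters = fcluster(linkage(dist, method="single"), t=5.0, criterion="distance")
print(clusters)   # isolates sharing >=95% ANI fall in the same cluster
```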


Assuntos
Irrigação Agrícola , Listeria/classificação , Filogenia , Microbiologia da Água , Técnicas de Tipagem Bacteriana , Composição de Bases , DNA Bacteriano/genética , Ácidos Graxos/química , Genes Bacterianos , Listeria/isolamento & purificação , New York , RNA Ribossômico 16S/genética , Análise de Sequência de DNA
19.
Front Microbiol ; 12: 590303, 2021.
Article in English | MEDLINE | ID: mdl-33796083

ABSTRACT

The use of untreated biological soil amendments of animal origin (BSAAO) has been identified as one potential mechanism for the dissemination and persistence of Salmonella in the produce growing environment. Data on factors influencing Salmonella concentration in amended soils are therefore needed. The objectives here were to (i) compare die-off between 12 Salmonella strains following inoculation in amended soil and (ii) characterize any significant effects associated with soil type, irrigation regimen, and amendment on Salmonella survival and die-off. Three greenhouse trials were performed using a randomized complete block design. Each strain (~4 log CFU/g) was homogenized with amended or non-amended sandy-loam or clay-loam soil. Salmonella levels were enumerated in 25 g samples 0, 0.167 (4 h), 1, 2, 4, 7, 10, 14, 21, 28, 56, 84, 112, 168, 210, 252, and 336 days post-inoculation (dpi), or until two consecutive samples were enrichment negative. Regression analysis was performed between strain, soil type, irrigation, and (i) time to last detect (survival) and (ii) concentration at each time point (die-off rate). Similar effects of strain, irrigation, soil type, and amendment were identified using the survival and die-off models. Strain explained up to 18% of the variance in survival and up to 19% of the variance in die-off rate. On average, Salmonella survived for 129 days in amended soils; however, Salmonella survived, on average, 30 days longer in clay-loam soils than in sandy-loam soils [95% confidence interval (CI) = 15, 45], with survival time ranging from 84 to 210 days for the individual strains during daily irrigation. When strain-specific associations were investigated using regression trees, S. Javiana and S. Saintpaul were found to survive longer in sandy-loam soil, whereas most of the other strains survived longer in clay-loam soil. Salmonella also survived, on average, 128 days longer when irrigated weekly, compared with daily (CI = 101, 154), and 89 days longer in amended soils than in non-amended soils (CI = 61, 116). Overall, this study provides insight into Salmonella survival following contamination of field soils by BSAAO. Specifically, Salmonella survival may be strain specific and is affected by both soil characteristics and management practices. These data can assist in risk assessment and in strain selection for use in challenge and validation studies.
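
A small sketch of the regression-tree step mentioned above, assuming synthetic data in which survival time depends on soil type, irrigation regimen, and amendment with magnitudes echoing the abstract; strain names are used only as labels.

```python
# Sketch: a regression tree relating Salmonella survival time to strain, soil
# type, irrigation regimen, and amendment status.
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeRegressor, export_text

rng = np.random.default_rng(9)
n = 300
df = pd.DataFrame({
    "strain": rng.choice(["Javiana", "Saintpaul", "Newport", "Typhimurium"], n),
    "soil": rng.choice(["sandy_loam", "clay_loam"], n),
    "irrigation": rng.choice(["daily", "weekly"], n),
    "amended": rng.choice([0, 1], n),
})
df["survival_days"] = (100 + 30 * (df.soil == "clay_loam")
                       + 128 * (df.irrigation == "weekly")
                       + 89 * df.amended + rng.normal(0, 20, n))

X = pd.get_dummies(df.drop(columns="survival_days"))
tree = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X, df.survival_days)
print(export_text(tree, feature_names=list(X.columns)))
```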

20.
MMWR Morb Mortal Wkly Rep ; 70(12): 431-436, 2021 Mar 26.
Article in English | MEDLINE | ID: mdl-33764963

ABSTRACT

The U.S. COVID-19 vaccination program began in December 2020, and ensuring equitable COVID-19 vaccine access remains a national priority.* COVID-19 has disproportionately affected racial/ethnic minority groups and those who are economically and socially disadvantaged (1,2). Thus, achieving not just vaccine equality (i.e., allocation of vaccine supply proportional to population across jurisdictions) but equity (i.e., preferential access and administration for those who have been most affected by COVID-19 disease) is an important goal. The CDC social vulnerability index (SVI) uses 15 indicators grouped into four themes that comprise an overall SVI measure, resulting in 20 metrics, each of which has national and state-specific county rankings. The 20 metric-specific rankings were each divided into tertiles to categorize counties as having low, moderate, or high social vulnerability. These tertiles were combined with vaccine administration data for 49,264,338 U.S. residents in 49 states and the District of Columbia (DC) who received at least one COVID-19 vaccine dose during December 14, 2020-March 1, 2021. Nationally, for the overall SVI measure, vaccination coverage was higher (15.8%) in low social vulnerability counties than in high social vulnerability counties (13.9%), with the largest coverage disparity in the socioeconomic status theme (2.5 percentage points higher coverage in low than in high vulnerability counties). Wide state variations in equity across SVI metrics were found. Whereas in the majority of states vaccination coverage was higher in low vulnerability counties, some states had equitable coverage at the county level. CDC, state, and local jurisdictions should continue to monitor vaccination coverage by SVI metrics to focus public health interventions and achieve equitable COVID-19 vaccine coverage.


Subjects
COVID-19 Vaccines/administration & dosage, Healthcare Disparities/statistics & numerical data, Residence Characteristics/statistics & numerical data, Vaccination Coverage/statistics & numerical data, Vulnerable Populations, COVID-19/epidemiology, COVID-19/prevention & control, Humans, Immunization Programs, Program Evaluation, Socioeconomic Factors, United States/epidemiology